Vapnik-Chervonenkis Dimension and (Pseudo-)Hyperplane Arrangements
Authors
Abstract
An arrangement of oriented pseudohyperplanes in affine d-space defines on its set X of pseudohyperplanes a set system (or range space) (X, R), R ⊆ 2^X, of VC-dimension d in a natural way: to every cell c in the arrangement assign the subset of pseudohyperplanes having c on their positive side, and let R be the collection of all these subsets. We investigate and characterize the range spaces corresponding to simple arrangements of pseudohyperplanes in this way; such range spaces are called pseudogeometric, and they have the property that the cardinality of R is maximum for the given VC-dimension. In general, such range spaces are called maximum, and we show that the number of ranges R ∈ R for which X − R ∈ R as well determines whether a maximum range space is pseudogeometric. Two other characterizations go via a simple duality concept and via 'small' subspaces. The correspondence to arrangements is obtained indirectly via a new characterization of uniform oriented matroids: a range space (X, R) naturally corresponds to a uniform oriented matroid of rank |X| − d if and only if its VC-dimension is d, R ∈ R implies X − R ∈ R, and |R| is maximum under these conditions.
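For intuition (the following sketch is ours, not from the paper): in a simple arrangement of n hyperplanes in general position in R^d, the cells correspond bijectively to the ranges, and their number is ∑_{i=0}^{d} C(n, i), which is exactly the Sauer-Shelah maximum for a range space of VC-dimension d. The Python sketch below builds the range space of n = 5 random lines in the plane (d = 2) by perturbing each vertex into its four incident cells (in a simple arrangement of at least two lines, every cell touches some vertex) and checks that |R| = 1 + 5 + 10 = 16.

```python
import itertools
from math import comb
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 2  # five random lines in the plane; general position holds almost surely

A = rng.standard_normal((n, d))  # normal vector a_i of line i
b = rng.standard_normal(n)       # line i is {x : a_i . x = b_i}; positive side is a_i . x > b_i

def cell_range(p):
    """The range of the cell containing p: all lines having p on their positive side."""
    return frozenset(i for i in range(n) if A[i] @ p > b[i])

ranges = set()
eps = 1e-6
for i, j in itertools.combinations(range(n), 2):
    v = np.linalg.solve(A[[i, j]], b[[i, j]])   # vertex: intersection of lines i and j
    ni = A[i] / np.linalg.norm(A[i])
    nj = A[j] / np.linalg.norm(A[j])
    for si, sj in itertools.product((-1, 1), repeat=2):
        # one sample point inside each of the four cells incident to the vertex
        ranges.add(cell_range(v + eps * (si * ni + sj * nj)))

print(len(ranges))                               # 16
print(sum(comb(n, i) for i in range(d + 1)))     # Sauer-Shelah maximum: 16
```

One can likewise count the ranges R whose complement is also a range via sum(frozenset(range(n)) - r in ranges for r in ranges); by the paper's result, this count is what distinguishes pseudogeometric range spaces among the maximum ones.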
Similar resources
Learning a Fuzzy Hyperplane Fat Margin Classifier with Minimum VC dimension
The Vapnik-Chervonenkis (VC) dimension measures the complexity of a learning machine, and a low VC dimension leads to good generalization. The recently proposed Minimal Complexity Machine (MCM) learns a hyperplane classifier by minimizing an exact bound on the VC dimension. This paper extends the MCM classifier to the fuzzy domain. The use of a fuzzy membership is known to reduce the effect of ...
A Neurodynamical System for finding a Minimal VC Dimension Classifier
The recently proposed Minimal Complexity Machine (MCM) finds a hyperplane classifier by minimizing an exact bound on the Vapnik-Chervonenkis (VC) dimension. The VC dimension measures the capacity of a learning machine, and a smaller VC dimension leads to improved generalization. On many benchmark datasets, the MCM generalizes better than SVMs and uses far fewer support vectors than the number u...
Quantifying Generalization in Linearly Weighted Neural Networks
The Vapnik-Chervonenkis dimension has proven to be of great use in the theoretical study of generalization in artificial neural networks. The "probably approximately correct" learning framework is described and the importance of the Vapnik-Chervonenkis dimension is illustrated. We then investigate the Vapnik-Chervonenkis dimension of certain types of linearly weighted neural ne...
Error Bounds for Real Function Classes Based on Discretized Vapnik-Chervonenkis Dimensions
The Vapnik-Chervonenkis (VC) dimension plays an important role in statistical learning theory. In this paper, we propose the discretized VC dimension obtained by discretizing the range of a real function class. Then, we point out that Sauer's Lemma is valid for the discretized VC dimension. We group the real function classes having infinite VC dimension into four categories by using the dis...
Sign rank versus Vapnik-Chervonenkis dimension
This work studies the maximum possible sign rank of N × N sign matrices with a given Vapnik-Chervonenkis dimension d. For d = 1, this maximum is three. For d = 2, this maximum is Θ̃(N^{1/2}). For d > 2, similar but slightly less accurate statements hold. The lower bounds improve on previous ones by Ben-David et al., and the upper bounds are novel. The lower bounds are obtained by probabilistic constr...
Journal: Discrete & Computational Geometry
Volume: 12, Issue: -
Pages: -
Published: 1994